It wasn't until I got into my 30s and began to read and learn more about health that I grew suspicious of the anti-sun propaganda that seemed to be cropping up everywhere.
How could the sun be so bad for us? The sun gives life and energy to just about everything on earth, so why were we all being told to avoid it at all costs? Are the sun's rays really "rays of death," as some media coverage suggests, or is there something we're missing?
The truth, as with most truths when it comes to real health, is more complicated...